11 research outputs found

    New results on Higgs boson properties

    We present the latest ATLAS and CMS measurements of several properties of the Higgs boson, such as signal-strength modifiers for the main production modes, fiducial and differential cross sections, and the Higgs mass. We have analyzed the 13 TeV proton-proton LHC collision data recorded in 2016, corresponding to integrated luminosities up to 36.1 fb⁻¹. Results for the H → ZZ → 4ℓ (ℓ = e, μ), H → γγ, and H → ττ decay channels are presented. In addition, searches for new phenomena in the H → γγ + E_T^miss and H → bb̄ + E_T^miss decay channels are presented.
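
    The signal-strength modifiers mentioned in this abstract are conventionally defined as ratios of the measured production cross section times branching fraction to the Standard Model expectation; schematically, for production mode i and decay channel f:

    ```latex
    % Conventional definition of the signal-strength modifier
    % (a standard parametrization, not taken verbatim from this abstract):
    \mu_i^f = \frac{\sigma_i \cdot \mathcal{B}^f}
                   {\left(\sigma_i \cdot \mathcal{B}^f\right)_{\mathrm{SM}}},
    \qquad \mu = 1 \;\text{corresponds to the SM prediction.}
    ```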

    A Roadmap for HEP Software and Computing R&D for the 2020s

    Particle physics has an ambitious and broad experimental programme for the coming decades. This programme requires large investments in detector hardware, either to build new facilities and experiments, or to upgrade existing ones. Similarly, it requires commensurate investment in the R&D of software to acquire, manage, process, and analyse the sheer amounts of data to be recorded. In planning for the HL-LHC in particular, it is critical that all of the collaborating stakeholders agree on the software goals and priorities, and that the efforts complement each other. In this spirit, this white paper describes the R&D activities required to prepare for this software upgrade.

    The CMS data transfer test environment in preparation for LHC data taking

    The CMS experiment is preparing for LHC data taking in several computing preparation activities. In distributed data transfer tests, in early 2007 a traffic load generator infrastructure was designed and deployed to equip the WLCG Tiers which support the CMS Virtual Organization with a means for debugging, load-testing and commissioning data transfer routes among CMS Computing Centres. The LoadTest is based upon PhEDEx as a reliable, scalable dataset replication system. In addition, a Debugging Data Transfers (DDT) Task Force was created to coordinate the debugging of data transfer links in the preparation period and during the Computing Software and Analysis challenge in 2007 (CSA07). The task force aimed to commission the most crucial transfer routes among CMS tiers by designing and enforcing a clear procedure to debug problematic links. This procedure aimed to move a link from a debugging phase in a separate and independent environment to a production environment when a set of agreed conditions is achieved for that link. The goal was to deliver working transfer routes to Data Operations one by one. The experiences with the overall test transfers infrastructure within computing challenges - as in the WLCG Common-VO Computing Readiness Challenge (CCRC'08) - as well as in daily testing and debugging activities are reviewed and discussed, and plans for the future are presented. © 2008 IEEE.
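
    The promotion rule described in this abstract (a link graduates from the debugging environment to production once agreed conditions are met) can be sketched as a simple check. The threshold, duration, and function names below are illustrative assumptions, not the actual CMS/DDT criteria:

    ```python
    # Hypothetical sketch of the DDT commissioning rule: a transfer link
    # is promoted to production once it sustains an agreed rate for a set
    # number of consecutive days. Both numbers here are assumed values.

    RATE_THRESHOLD_MBPS = 20.0   # assumed minimum sustained transfer rate
    REQUIRED_DAYS = 3            # assumed consecutive days above threshold

    def commissioned(daily_rates_mbps):
        """Return True if the link met the (assumed) agreed conditions:
        REQUIRED_DAYS consecutive days at or above RATE_THRESHOLD_MBPS."""
        streak = 0
        for rate in daily_rates_mbps:
            streak = streak + 1 if rate >= RATE_THRESHOLD_MBPS else 0
            if streak >= REQUIRED_DAYS:
                return True
        return False
    ```

    A link with rates [25, 30, 22] MB/s would pass, while [25, 5, 30, 40] would not, since the streak is reset by the day below threshold.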

    Searching for long-lived particles beyond the Standard Model at the Large Hadron Collider

    Particles beyond the Standard Model (SM) can generically have lifetimes that are long compared to SM particles at the weak scale. When produced at experiments such as the Large Hadron Collider (LHC) at CERN, these long-lived particles (LLPs) can decay far from the interaction vertex of the primary proton–proton collision. Such LLP signatures are distinct from those of promptly decaying particles that are targeted by the majority of searches for new physics at the LHC, often requiring customized techniques to identify, for example, significantly displaced decay vertices, tracks with atypical properties, and short track segments. Given their non-standard nature, a comprehensive overview of LLP signatures at the LHC is beneficial to ensure that possible avenues of the discovery of new physics are not overlooked. Here we report on the joint work of a community of theorists and experimentalists with the ATLAS, CMS, and LHCb experiments—as well as those working on dedicated experiments such as MoEDAL, milliQan, MATHUSLA, CODEX-b, and FASER—to survey the current state of LLP searches at the LHC, and to chart a path for the development of LLP searches into the future, both in the upcoming Run 3 and at the high-luminosity LHC. The work is organized around the current and future potential capabilities of LHC experiments to generally discover new LLPs, and takes a signature-based approach to surveying classes of models that give rise to LLPs rather than emphasizing any particular theory motivation. We develop a set of simplified models; assess the coverage of current searches; document known, often unexpected backgrounds; explore the capabilities of proposed detector upgrades; provide recommendations for the presentation of search results; and look towards the newest frontiers, namely high-multiplicity ‘dark showers’, highlighting opportunities for expanding the LHC reach for these signals.
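
    The displaced-decay phenomenology described here follows from the exponential decay law: an LLP with proper lifetime τ and Lorentz boost βγ has a lab-frame mean decay length L = βγcτ, and the fraction decaying inside a given detector volume follows directly. A minimal sketch, with the radii and boost values purely illustrative:

    ```python
    # Minimal sketch of the fraction of LLPs decaying between two radii,
    # assuming an exponential decay law with lab-frame mean decay length
    # L = betagamma * ctau. All numeric inputs below are illustrative.
    import math

    def decay_fraction(betagamma, ctau_m, r_in_m, r_out_m):
        """Fraction of LLPs decaying between radii r_in_m and r_out_m
        (in metres), for proper decay length ctau_m and boost betagamma."""
        L = betagamma * ctau_m  # lab-frame mean decay length
        return math.exp(-r_in_m / L) - math.exp(-r_out_m / L)

    # Example: boost 10, c*tau = 0.1 m, so L = 1 m; the fraction decaying
    # inside the first metre is 1 - exp(-1), about 63%.
    f_inner = decay_fraction(10.0, 0.1, 0.0, 1.0)
    ```

    This illustrates why coverage depends strongly on lifetime: for large cτ most decays fall outside the inner tracker, motivating the displaced-vertex techniques and dedicated downstream detectors (e.g. MATHUSLA, FASER) surveyed in the report.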

    The International Linear Collider: Report to Snowmass 2021

    The International Linear Collider (ILC) is on the table now as a new global energy-frontier accelerator laboratory taking data in the 2030s. The ILC addresses key questions for our current understanding of particle physics. It is based on a proven accelerator technology. Its experiments will challenge the Standard Model of particle physics and will provide a new window to look beyond it. This document brings the story of the ILC up to date, emphasizing its strong physics motivation, its readiness for construction, and the opportunity it presents to the US and the global particle physics community.
